Measuring Bayesian Robustness Using Rényi Divergence
Abstract
This paper deals with measuring the Bayesian robustness of classes of contaminated priors. Two different classes of priors in the neighborhood of the elicited prior are considered. The first is the well-known ϵ-contaminated class, while the second is the geometric mixing class. The proposed measure is based on computing the curvature of the Rényi divergence between posterior distributions. Examples illustrate the results using both simulated and real data sets.
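For orientation, the following is a minimal sketch of the standard forms these objects usually take. The abstract does not spell out the paper's notation, so the symbols below (π₀ for the elicited prior, q for a contaminant drawn from a class Q, α for the divergence order) are assumptions rather than quotations:

```latex
% Standard epsilon-contaminated neighborhood of an elicited prior \pi_0,
% with q ranging over a class Q of contaminating distributions:
\Gamma_\varepsilon = \bigl\{\, \pi : \pi = (1-\varepsilon)\,\pi_0 + \varepsilon\, q,\; q \in Q \,\bigr\}

% Geometric mixing class (a normalized log-linear mixture):
\Gamma_\varepsilon^{g} = \bigl\{\, \pi : \pi \propto \pi_0^{\,1-\varepsilon}\, q^{\,\varepsilon},\; q \in Q \,\bigr\}

% Rényi divergence of order \alpha between the resulting posteriors:
D_\alpha\bigl(\pi(\cdot\mid x)\,\|\,\pi_0(\cdot\mid x)\bigr)
  = \frac{1}{\alpha-1}\,\log \int \pi(\theta\mid x)^{\alpha}\,\pi_0(\theta\mid x)^{1-\alpha}\,d\theta
```

Under this reading, the "curvature" would be the second derivative of the map ε ↦ D_α evaluated at ε = 0, although the abstract does not make that explicit. A minimal numerical sketch of the resulting pipeline, assuming a Beta-Bernoulli toy model whose concrete numbers are illustrative and not from the paper:

```python
# Minimal numerical sketch (illustrative only, not the paper's code):
# Beta-Bernoulli toy model on a grid. Assumed ingredients: elicited prior
# pi0 = Beta(2, 2), contaminant q = Beta(8, 2), data s = 14 successes in
# n = 20 trials, Rényi order alpha = 0.5. "Curvature" is read here as the
# second derivative of eps -> D_alpha at eps = 0.
from math import lgamma
import numpy as np

theta = np.linspace(1e-6, 1 - 1e-6, 10_000)   # grid over the parameter space
dtheta = theta[1] - theta[0]

def beta_pdf(x, a, b):
    logc = lgamma(a + b) - lgamma(a) - lgamma(b)
    return np.exp(logc + (a - 1) * np.log(x) + (b - 1) * np.log(1 - x))

pi0 = beta_pdf(theta, 2.0, 2.0)          # elicited prior density on the grid
q = beta_pdf(theta, 8.0, 2.0)            # contaminating prior density
s, n = 14, 20
lik = theta**s * (1 - theta)**(n - s)    # Bernoulli likelihood (unnormalized)

def posterior(prior):
    dens = prior * lik
    return dens / (dens.sum() * dtheta)  # normalize numerically on the grid

def renyi(p1, p2, alpha=0.5):
    # Rényi divergence of order alpha between two grid densities
    return np.log(np.sum(p1**alpha * p2**(1 - alpha)) * dtheta) / (alpha - 1)

def d_alpha(eps):
    # eps-contaminated prior -> posterior -> divergence from baseline posterior
    contaminated = (1 - eps) * pi0 + eps * q
    return renyi(posterior(contaminated), posterior(pi0))

# Second derivative at eps = 0 via a forward second difference
# (d_alpha(0) = 0, since the two posteriors coincide at eps = 0).
h = 1e-3
curvature = (d_alpha(2 * h) - 2 * d_alpha(h) + d_alpha(0.0)) / h**2
print(f"estimated curvature of D_alpha at eps = 0: {curvature:.4f}")
```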
Similar Resources
PAC-Bayesian Bounds based on the Rényi Divergence
We propose a simplified proof process for PAC-Bayesian generalization bounds that makes it possible to divide the proof into four successive inequalities, easing the "customization" of PAC-Bayesian theorems. We also propose a family of PAC-Bayesian bounds based on the Rényi divergence between the prior and posterior distributions, whereas most PAC-Bayesian bounds are based on the Kullback-Leibler divergence…
Rényi Divergence Variational Inference
This paper introduces the variational Rényi bound (VR) that extends traditional variational inference to Rényi’s α-divergences. This new family of variational methods unifies a number of existing approaches, and enables a smooth interpolation from the evidence lower-bound to the log (marginal) likelihood that is controlled by the value of α that parametrises the divergence. The reparameterizati...
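For reference, the variational Rényi bound described in this snippet is usually stated as follows; this is a sketch following the standard formulation, with observations x, latent variables z, and approximate posterior q(z) as assumed notation:

```latex
\mathcal{L}_\alpha(q; x)
  = \frac{1}{1-\alpha}\,
    \log \mathbb{E}_{q(z)}\!\left[\left(\frac{p(x, z)}{q(z)}\right)^{1-\alpha}\right]
```

Taking α → 1 recovers the evidence lower bound, while α = 0 gives log p(x) whenever q has full support, which is the interpolation the snippet describes.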
Rényi divergence and majorization
Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
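A quick check of the order-1 statement, using the standard discrete definition (not quoted from the snippet):

```latex
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}\,\log \sum_i p_i^{\alpha}\, q_i^{1-\alpha},
\qquad
\lim_{\alpha \to 1} D_\alpha(P \,\|\, Q)
  = \sum_i p_i \log \frac{p_i}{q_i}
  = D_{\mathrm{KL}}(P \,\|\, Q)
```

The limit follows, for instance, by L'Hôpital's rule: differentiating log Σᵢ pᵢ^α qᵢ^{1−α} with respect to α and evaluating at α = 1 yields Σᵢ pᵢ log(pᵢ/qᵢ).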
Journal
Journal title: Stats
Year: 2021
ISSN: 2571-905X
DOI: https://doi.org/10.3390/stats4020018